Estimation of Squared-Loss Mutual Information from Positive and Unlabeled Data

Authors

  • Tomoya Sakai
  • Gang Niu
  • Masashi Sugiyama
Abstract

Capturing input-output dependency is an important task in statistical data analysis. Mutual information (MI) is a vital tool for this purpose, but it is known to be sensitive to outliers. To cope with this problem, a squared-loss variant of MI (SMI) was proposed, and its supervised estimator has been developed. On the other hand, in real-world classification problems, it is conceivable that only positive and unlabeled (PU) data are available. In this paper, we propose a novel estimator of SMI computed only from PU data, and prove its optimal convergence to the true SMI. Based on the PU-SMI estimator, we further propose a dimension reduction method which can be executed without estimating the class-prior probabilities of unlabeled data. Such PU class-prior estimation is often required in PU classification algorithms, but it is unreliable particularly in high-dimensional problems, yielding a biased classifier. Our dimension reduction method significantly boosts the accuracy of PU class-prior estimation, as demonstrated through experiments. We also develop an independence test based on our PU-SMI estimator and experimentally show its superiority.
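For reference, SMI is the Pearson divergence between the joint density and the product of the marginals (the notation below follows the standard SMI literature, not this paper's exact formulation):

```latex
\mathrm{SMI}(X,Y) = \frac{1}{2} \iint p(x)\,p(y)
\left( \frac{p(x,y)}{p(x)\,p(y)} - 1 \right)^{2} \mathrm{d}x\,\mathrm{d}y
```

SMI is zero if and only if X and Y are independent, which is what makes it usable for independence testing and dimension reduction.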


Similar Papers

Maximum mutual information estimation with unlabeled data for phonetic classification

This paper proposes a new training framework for mixed labeled and unlabeled data and evaluates it on the task of binary phonetic classification. Our training objective function combines Maximum Mutual Information (MMI) for labeled data and Maximum Likelihood (ML) for unlabeled data. Through the modified training objective, MMI estimates are smoothed with ML estimates obtained from unlabeled da...


Bayes, E-Bayes and Robust Bayes Premium Estimation and Prediction under the Squared Log Error Loss Function

In risk analysis based on Bayesian framework, premium calculation requires specification of a prior distribution for the risk parameter in the heterogeneous portfolio. When the prior knowledge is vague, the E-Bayesian and robust Bayesian analysis can be used to handle the uncertainty in specifying the prior distribution by considering a class of priors instead of a single prior. In th...


Computationally Efficient Estimation of Squared-Loss Mutual Information with Multiplicative Kernel Models

Squared-loss mutual information (SMI) is a robust measure of the statistical dependence between random variables. The sample-based SMI approximator called least-squares mutual information (LSMI) was demonstrated to be useful in performing various machine learning tasks such as dimension reduction, clustering, and causal inference. The original LSMI approximates the pointwise mutual information ...
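The LSMI approximator mentioned above fits a kernel model to the pointwise density ratio p(x,y)/(p(x)p(y)) by least squares, which admits a closed-form solution. The following is a minimal sketch of that idea for 1-D variables, assuming Gaussian kernels; the function name `lsmi` and its parameters are illustrative, not the authors' reference implementation.

```python
import numpy as np

def lsmi(x, y, sigma=1.0, lam=0.1, n_basis=100):
    """Least-squares SMI estimate for 1-D samples x, y (sketch)."""
    n = len(x)
    b = min(n_basis, n)
    # Gaussian kernel basis centered at a random subset of sample pairs
    idx = np.random.permutation(n)[:b]
    cx, cy = x[idx], y[idx]
    Kx = np.exp(-(x[:, None] - cx[None, :]) ** 2 / (2 * sigma ** 2))
    Ky = np.exp(-(y[:, None] - cy[None, :]) ** 2 / (2 * sigma ** 2))
    # h_l ~ E_{p(x,y)}[phi_l(x,y)], estimated over paired samples
    h = (Kx * Ky).mean(axis=0)
    # H_{l,l'} ~ E_{p(x)p(y)}[phi_l phi_l'], estimated over all cross pairs
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n ** 2
    # Ridge-regularized least-squares fit of the density-ratio model
    alpha = np.linalg.solve(H + lam * np.eye(b), h)
    # Plug-in SMI estimate: (1/2) h^T alpha - 1/2
    return 0.5 * h @ alpha - 0.5
```

Strongly dependent samples should yield a clearly larger estimate than independent ones; in practice, sigma and lam are chosen by cross-validation.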


Machine Learning with Squared-Loss Mutual Information

Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning problems. Recently, an alternative to MI called squared-loss MI (SMI) was introduced. While ordinary MI is the Kullback–Leibler divergence from the joint distribution to the product of the marginal distributions, SMI is its P...


Analysis of Learning from Positive and Unlabeled Data

Learning a classifier from positive and unlabeled data is an important class of classification problems that are conceivable in many practical applications. In this paper, we first show that this problem can be solved by cost-sensitive learning between positive and unlabeled data. We then show that convex surrogate loss functions such as the hinge loss may lead to a wrong classification boundar...



Journal:
  • CoRR

Volume: abs/1710.05359

Publication date: 2017